# 16k Long Context

Qwen2.5 Math 7B 16k Think
MIT
An improved model based on Qwen2.5-Math-7B, with an extended context window and optimized reasoning capabilities.
Large Language Model · Transformers
Elliott · 3,496 · 1
Jais Family 30b 16k Chat
Apache-2.0
The Jais family is a series of bilingual large language models optimized for Arabic while also possessing strong English capabilities. The 30B-16K version has 30 billion parameters and supports a context length of 16,384 tokens.
Large Language Model · Safetensors · Supports Multiple Languages
inceptionai · 59 · 12
Chat2db SQL 7B
Apache-2.0
A 7-billion-parameter model fine-tuned from CodeLlama, designed specifically for natural-language-to-SQL tasks; it supports multiple SQL dialects and a 16k context length.
Large Language Model · Transformers · Supports Multiple Languages
Chat2DB · 382 · 51
Codellama 70b Instruct Hf
Code Llama is a series of pre-trained and fine-tuned generative text models designed for general-purpose code synthesis and understanding. This model is the 70-billion-parameter instruction-tuned version.
Large Language Model · Transformers · Other
codellama · 8,108 · 208
© 2025 AIbase